8 MGF, Convolution

1 MGF

MGF

The moment generating function (MGF) of a RV $X$ is a function $M_X:\mathbb{R}\to[0,+\infty)$ given by $M_X(t)=E[e^{tX}]$, $t\in\mathbb{R}$.

  1. If $X_1,\dots,X_n$ are independent and $S_n=X_1+\dots+X_n$, then $M_{S_n}(t)=\prod_{k=1}^{n}M_{X_k}(t)$.
  2. We can show quickly that $M_{\alpha X+\beta}(t)=E[e^{(\alpha X+\beta)t}]=e^{\beta t}E[e^{\alpha Xt}]=e^{\beta t}M_X(\alpha t)$.
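As a quick numerical sanity check of both properties (a sketch, not part of the notes' derivation; the Bernoulli/Binomial example and all names here are my own choices):

```python
import math

# Property 1: for independent X_1, ..., X_n ~ Bernoulli(p), S_n ~ Binomial(n, p),
# so the MGF of S_n computed from the Binomial pmf should equal M_X(t)^n.
def mgf_bernoulli(t, p):
    return (1 - p) + p * math.exp(t)

def mgf_binomial(t, n, p):
    # direct expectation E[e^{t S_n}] over the Binomial(n, p) pmf
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
               for k in range(n + 1))

n, p, t = 7, 0.3, 0.5
assert abs(mgf_binomial(t, n, p) - mgf_bernoulli(t, p)**n) < 1e-12

# Property 2: M_{aX+b}(t) = e^{bt} M_X(at), checked exactly for X ~ Bernoulli(p)
a, b = 2.0, -1.0
lhs = (1 - p) * math.exp(t * (a * 0 + b)) + p * math.exp(t * (a * 1 + b))
rhs = math.exp(b * t) * mgf_bernoulli(a * t, p)
assert abs(lhs - rhs) < 1e-12
```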

The MGF is very useful for computing moments, as the following theorem shows.

Theorem

If $M_X(t)<\infty$ for all $t\in(-\varepsilon,\varepsilon)$, then

  1. $\left.\frac{d^k}{dt^k}M_X(t)\right|_{t=0}=E[X^k]$.
  2. For $t$ within the radius of convergence, $M_X(t)=\sum_{k=0}^{\infty}E[X^k]\frac{t^k}{k!}$.
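To illustrate part 1 numerically (a sketch with my own choice of distribution: for $X\sim\text{Exp}(\lambda)$, $M_X(t)=\lambda/(\lambda-t)$ and $E[X^k]=k!/\lambda^k$), we can recover the first two moments by finite-difference differentiation of the MGF at $t=0$:

```python
# For X ~ Exp(lam), M_X(t) = lam / (lam - t) for t < lam, and E[X^k] = k!/lam^k.
lam = 2.0
def M(t):
    return lam / (lam - t)

h = 1e-4
first = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0) = E[X]
second = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ M''(0) = E[X^2]

assert abs(first - 1 / lam) < 1e-6          # E[X] = 1/2
assert abs(second - 2 / lam**2) < 1e-6      # E[X^2] = 2/lam^2 = 1/2
```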
Theorem (Uniqueness)

Suppose $X,Y$ are two RVs with well-defined MGFs. If $M_X(t)=M_Y(t)$ for all $t\in(-\varepsilon,\varepsilon)$, then $X\stackrel{d}{=}Y$.

Recall the earlier discussion on convergence in distribution.

Theorem (Convergence in Distribution/Continuity Theorem)

Suppose $X_1,X_2,\dots$ is a sequence of RVs with MGFs $M_{X_n}(t)$ well defined for $t\in(-\varepsilon,\varepsilon)$. If $M_{X_n}(t)\to M(t)$ for all $t\in(-\varepsilon,\varepsilon)$, then $M(t)=M_X(t)$, where $M_X$ is the MGF of a RV $X$ such that $X_n\stackrel{d}{\to}X$.


Recall the Bernoulli process: $P(\text{success})=p$, $P(\text{failure})=1-p$. $T_r$ is the total number of trials until the $r$th success, and $F_r$ is the total number of failures until the $r$th success, so $F_r+r=T_r$. We showed that $F_r\sim\text{NegativeBinomial}(r,p)$, and $T_r=W_1+\dots+W_r$, where $W_1,\dots,W_r\stackrel{\text{i.i.d.}}{\sim}\text{Geometric}(p)$. So by property 1 above, $M_{T_r}(t)=\left[\frac{pe^t}{1-(1-p)e^t}\right]^r$.
Since $F_r=T_r-r$, $M_{F_r}(t)=E[e^{tF_r}]=e^{-tr}E[e^{tT_r}]=\left[\frac{p}{1-(1-p)e^t}\right]^r$.
Let $X_n=F_{r,n}/n$, where $F_{r,n}\sim\text{NegativeBinomial}(r,\lambda/n)$.¹ Then $M_{X_n}(t)=E[e^{tF_{r,n}/n}]=\left[\frac{\lambda/n}{1-(1-\lambda/n)e^{t/n}}\right]^r\to\left(\frac{\lambda}{\lambda-t}\right)^r$ as $n\to\infty$.
Now let $X=Y_1+\dots+Y_r$, where $Y_1,\dots,Y_r\stackrel{\text{i.i.d.}}{\sim}\text{Exp}(\lambda)$. Then $X\sim\text{Gamma}(r,\lambda)$, with p.d.f. $f_X(x)=\frac{\lambda^r}{\Gamma(r)}x^{r-1}e^{-\lambda x}$, and $M_X(t)=\prod_{i=1}^{r}M_{Y_i}(t)=\left(\frac{\lambda}{\lambda-t}\right)^r$.
So $X_n\stackrel{d}{\to}X$ by the continuity theorem.
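The MGF convergence above can be checked numerically (a sketch; the specific values of $r$, $\lambda$, $t$ are my own):

```python
import math

# M_{X_n}(t) for X_n = F_{r,n}/n with F_{r,n} ~ NegativeBinomial(r, lam/n):
# [ (lam/n) / (1 - (1 - lam/n) e^{t/n}) ]^r, which should approach the
# Gamma(r, lam) MGF (lam / (lam - t))^r as n grows.
r, lam, t = 3, 2.0, 0.7   # need t < lam

def mgf_xn(n):
    p = lam / n
    return (p / (1 - (1 - p) * math.exp(t / n))) ** r

gamma_mgf = (lam / (lam - t)) ** r
assert abs(mgf_xn(10**6) - gamma_mgf) < 1e-4
# the gap shrinks as n grows
assert abs(mgf_xn(10**3) - gamma_mgf) > abs(mgf_xn(10**6) - gamma_mgf)
```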

2 Convolution

Convolution can be used to analyze sums of random variables.
Let $X,Y$ be discrete RVs on the same probability space. We want to find $P(X+Y=c)$. Then $P(X+Y=c)=\sum_{(a,b):\,a+b=c}P(X=a,Y=b)=\sum_{a}P(X=a,Y=c-a)$.
If we further assume $X\perp Y$, then $P(X+Y=c)=\sum_{a}P(X=a)P(Y=c-a)$.
This sum is called the convolution of the distributions of $X$ and $Y$.
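A small discrete example (my own illustration, assuming two independent fair dice) computes the distribution of the sum directly from the convolution formula:

```python
# Discrete convolution: distribution of X + Y for two independent fair dice,
# via P(X+Y=c) = sum_a P(X=a) P(Y=c-a).
die = {k: 1 / 6 for k in range(1, 7)}

def convolve(px, py):
    out = {}
    for a, pa in px.items():
        for b, pb in py.items():
            out[a + b] = out.get(a + b, 0.0) + pa * pb
    return out

psum = convolve(die, die)
assert abs(psum[7] - 6 / 36) < 1e-12        # 7 is the most likely total
assert abs(sum(psum.values()) - 1) < 1e-12  # probabilities sum to 1
```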

For continuous RVs, similarly we have $f_{X+Y}(z)=\int_{-\infty}^{+\infty}f_X(x)f_Y(z-x)\,dx$.
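Tying this back to the Gamma example: for $X,Y\stackrel{\text{i.i.d.}}{\sim}\text{Exp}(\lambda)$, the convolution integral should reproduce the $\text{Gamma}(2,\lambda)$ density $\lambda^2 z e^{-\lambda z}$. A sketch approximating the integral with a midpoint Riemann sum (all names and parameter values are my own):

```python
import math

# Continuous convolution: f_{X+Y}(z) = int f_X(x) f_Y(z - x) dx for
# X, Y i.i.d. Exp(lam); the exact answer is the Gamma(2, lam) pdf.
lam = 1.0
def f_exp(x):
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def f_sum(z, steps=10_000):
    # midpoint rule on int_0^z f_X(x) f_Y(z - x) dx
    # (the integrand vanishes outside [0, z])
    dx = z / steps
    return sum(f_exp((i + 0.5) * dx) * f_exp(z - (i + 0.5) * dx)
               for i in range(steps)) * dx

z = 1.5
exact = lam**2 * z * math.exp(-lam * z)   # Gamma(2, lam) pdf at z
assert abs(f_sum(z) - exact) < 1e-6
```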


We use another example to show the application of both MGF and convolution.


  1. Note that the concrete expression is $\text{NegativeBinomial}(r,\lambda/n)/n$. ↩︎